Coordinate Descent Optimization for ℓ1 Minimization with Application to Compressed Sensing: a Greedy Algorithm

Authors

  • Yingying Li
  • Stanley Osher
Abstract

We propose a fast algorithm for solving the Basis Pursuit problem, min_u { ‖u‖_1 : Au = f }, which has applications to compressed sensing. We design an efficient method for solving the related unconstrained problem min_u E(u) = ‖u‖_1 + λ‖Au − f‖_2^2 based on a greedy coordinate descent method. We claim that, in combination with a Bregman iterative method, our algorithm achieves a solution with speed and accuracy competitive with some of the leading methods for the basis pursuit problem.
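The unconstrained problem above has a closed-form minimizer along each coordinate: fixing all entries of u except u_j reduces E to a one-dimensional ℓ1-regularized quadratic, whose solution is a soft-thresholding (shrinkage) of the least-squares update. A greedy scheme then updates the coordinate whose change is largest. The following is a minimal sketch of that idea, not the authors' exact algorithm; the function names and the selection rule (largest coordinate change as a proxy for largest energy decrease) are illustrative assumptions.

```python
import numpy as np

def shrink(x, t):
    # soft-thresholding operator: argmin_u |u| * t' + (u - x)^2 form reduces to this
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def greedy_cd_l1(A, f, lam, iters=500, tol=1e-12):
    """Greedy coordinate descent sketch for min_u ||u||_1 + lam * ||A u - f||_2^2.

    Assumes every column of A is nonzero. This is an illustrative
    implementation, not the method from the paper.
    """
    m, n = A.shape
    u = np.zeros(n)
    r = f.astype(float).copy()          # residual r = f - A u
    col_sq = np.sum(A * A, axis=0)      # squared column norms ||a_j||^2

    for _ in range(iters):
        # coordinate-wise minimizers: u_j* = shrink(a_j^T r_j / ||a_j||^2, 1/(2 lam ||a_j||^2)),
        # where r_j = r + a_j u_j is the residual with coordinate j removed
        z = (A.T @ r + col_sq * u) / col_sq
        u_new = shrink(z, 1.0 / (2.0 * lam * col_sq))

        # greedy selection: the coordinate whose update moves the most
        j = int(np.argmax(np.abs(u_new - u)))
        d = u_new[j] - u[j]
        if abs(d) < tol:
            break
        r -= A[:, j] * d                # keep r = f - A u consistent
        u[j] = u_new[j]
    return u
```

For A = I the method reduces to componentwise shrinkage of f with threshold 1/(2λ), which gives a quick sanity check of the update formula.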


Related articles

Feature Clustering for Accelerating Parallel Coordinate Descent

Large-scale ℓ1-regularized loss minimization problems arise in high-dimensional applications such as compressed sensing and high-dimensional supervised learning, including classification and regression problems. High-performance algorithms and implementations are critical to efficiently solving these problems. Building upon previous work on coordinate descent algorithms for ℓ1-regularized probl...


Stochastic Greedy Methods with Sparse Constraints

Motivated by recent work on stochastic gradient descent methods, we develop two stochastic variants of greedy algorithms for optimization problems with sparsity constraints. We prove linear convergence in expectation to the solution within a specified tolerance. This framework applies to problems such as sparse signal recovery in compressed sensing and low-rank matrix recovery, giving methods wi...


Doubly Greedy Primal-Dual Coordinate Descent for Sparse Empirical Risk Minimization

We consider the popular problem of sparse empirical risk minimization with linear predictors and a large number of both features and observations. With a convex-concave saddle point objective reformulation, we propose a Doubly Greedy Primal-Dual Coordinate Descent algorithm that is able to exploit sparsity in both primal and dual variables. It enjoys a low cost per iteration and our theoretical ...


Convergence Analysis of the Approximate Proximal Splitting Method for Non-Smooth Convex Optimization

Consider a class of convex minimization problems for which the objective function is the sum of a smooth convex function and a non-smooth convex regularization term. This class of problems includes several popular applications such as compressive sensing and sparse group LASSO. In this thesis, we introduce a general class of approximate proximal splitting (APS) methods for solving such minimization...


A coordinate gradient descent method for ℓ1-regularized convex minimization

In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., minimizing ℓ1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated a...




Publication date: 2009